
    Bayesian approaches to trajectory estimation in maritime surveillance

    In maritime surveillance, multi-sensor data differ greatly in their temporal resolution. Additionally, owing to multi-level security and information management processing, many contact reports arrive hours after the observations were made. As a result, contact report data are typically available for batch processing. The dissimilar multi-source information environment yields contact reports with heteroscedastic and correlated errors (i.e., measurement errors characterized by normal probability distributions with non-constant and non-diagonal covariance matrices), and the measurement errors may be relatively large. Hence, the appropriate choice of a trajectory estimation algorithm that addresses these characteristics of the surveillance data will significantly contribute to increased awareness in the maritime domain. This thesis presents two novel batch single-ship trajectory estimation algorithms employing Bayesian approaches to estimation: (1) a stochastic linear filtering algorithm and (2) a curve fitting algorithm that employs Bayesian statistical inference for nonparametric regression. The stochastic linear filtering algorithm uses a combination of two stochastic processes, namely the Integrated Ornstein-Uhlenbeck (IOU) process and the random walk (RW) process, to describe the ship's motion. The assumptions of linear modeling and a bivariate Gaussian distribution of the measurement errors allow for the use of Kalman filtering and Rauch-Tung-Striebel optimal smoothing. In the curve fitting algorithm, the trajectory is taken to be a cubic spline with an unknown number of knots in the two-dimensional Euclidean plane of longitude and latitude. The function estimate is determined from the data, which are assumed to be Gaussian distributed. A fully Bayesian approach is adopted by defining prior distributions on all unknown parameters: the spline coefficients, and the number and locations of the knots. The calculation of the p…
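    As a rough, illustrative sketch of the building blocks behind the first algorithm (not the thesis's actual implementation), the Python snippet below runs a Kalman filter followed by a Rauch-Tung-Striebel smoother on a generic linear-Gaussian state-space model; the transition matrix, process noise, and measurement noise used here are placeholder values standing in for the IOU/RW motion model and error covariances derived in the thesis.

```python
# Minimal Kalman filter + Rauch-Tung-Striebel smoother sketch (NumPy).
# F, Q, H, R below are illustrative placeholders, not the IOU/RW model
# parameters or the heteroscedastic error covariances from the thesis.
import numpy as np

def kalman_filter(zs, F, Q, H, R, x0, P0):
    """Forward pass: returns filtered and predicted means/covariances."""
    xs_f, Ps_f, xs_p, Ps_p = [], [], [], []
    x, P = x0, P0
    for z in zs:
        # Predict
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x = x_pred + K @ (z - H @ x_pred)
        P = (np.eye(len(x)) - K @ H) @ P_pred
        xs_p.append(x_pred); Ps_p.append(P_pred)
        xs_f.append(x);      Ps_f.append(P)
    return xs_f, Ps_f, xs_p, Ps_p

def rts_smoother(xs_f, Ps_f, xs_p, Ps_p, F):
    """Backward pass: Rauch-Tung-Striebel optimal smoothing."""
    n = len(xs_f)
    xs_s, Ps_s = [None] * n, [None] * n
    xs_s[-1], Ps_s[-1] = xs_f[-1], Ps_f[-1]
    for k in range(n - 2, -1, -1):
        C = Ps_f[k] @ F.T @ np.linalg.inv(Ps_p[k + 1])
        xs_s[k] = xs_f[k] + C @ (xs_s[k + 1] - xs_p[k + 1])
        Ps_s[k] = Ps_f[k] + C @ (Ps_s[k + 1] - Ps_p[k + 1]) @ C.T
    return xs_s, Ps_s

if __name__ == "__main__":
    dt = 60.0  # hypothetical time step between contact reports (s)
    # Placeholder near-constant-velocity model in one coordinate.
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = np.diag([1e-4, 1e-6])
    H = np.array([[1.0, 0.0]])
    R = np.array([[0.01]])
    zs = [np.array([0.01 * k + 0.1 * np.random.randn()]) for k in range(20)]
    out = kalman_filter(zs, F, Q, H, R, np.zeros(2), np.eye(2))
    xs_smoothed, _ = rts_smoother(*out, F)
```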

    Comparative analysis of the IMM-JVC and the IMM-JPDA algorithms for multiple-target tracking

    When tracking closely maneuvering targets, a critical role is played by both the chosen method of data association and the target-tracking algorithm. Without an effective association, state estimation is at risk; without an efficient state prediction, the performance of an associator can be degraded. In developing an assignment strategy, the crucial issue is whether to assign a track or observation as belonging uniquely to another observation or track, or to allow a track to be associated non-uniquely with multiple candidate observations. This thesis presents a comparative study of two assignment alternatives, namely the JVC (unique association of a measurement to an existing track) and JPDA (non-unique association of a measurement to an existing track) algorithms. These assignment strategies were combined with an Interacting Multiple Model (IMM) positional estimator, whose superiority over other single-scan algorithms has been widely documented. The respective tracking performance of the IMM-JVC and IMM-JPDAF algorithms for multiple-target tracking has been evaluated. After a detailed description of the IMM-JVC and IMM-JPDAF formalisms and of the IMM-JPDAF implementation issues, an analysis of the results of JVC association compared to JPDA association is presented. Simulation results obtained on two scenarios involving two closely maneuvering aircraft confirm the superiority of the IMM-JVC algorithm.
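    To make the contrast between the two assignment strategies concrete, here is a minimal, hypothetical sketch: unique assignment is approximated with SciPy's linear_sum_assignment (a stand-in for the JVC algorithm), while the soft association is a simplified PDA-style per-track weighting rather than the full JPDAF joint-event enumeration; the toy measurements, innovation covariance, detection probability, and clutter density are illustrative assumptions.

```python
# Sketch contrasting unique vs. soft measurement-to-track association.
# linear_sum_assignment stands in for the JVC algorithm; the soft weights
# are a simplified PDA-style normalization, not full JPDAF joint events.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.stats import multivariate_normal

def unique_assignment(cost):
    """One measurement per track: minimise the total association cost."""
    rows, cols = linear_sum_assignment(cost)
    return dict(zip(rows, cols))          # track index -> measurement index

def soft_association_weights(z_preds, S, zs, p_detect=0.9, clutter=1e-4):
    """PDA-style weights: each track spreads probability over measurements."""
    weights = []
    for z_pred in z_preds:
        lik = np.array([multivariate_normal.pdf(z, mean=z_pred, cov=S)
                        for z in zs])
        beta = p_detect * lik
        beta = np.append(beta, clutter * (1.0 - p_detect))  # "missed" hypothesis
        weights.append(beta / beta.sum())
    return weights

if __name__ == "__main__":
    # Two predicted track positions and three received measurements (toy data).
    z_preds = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
    zs = [np.array([0.2, -0.1]), np.array([5.3, 4.8]), np.array([2.5, 2.4])]
    S = 0.5 * np.eye(2)
    cost = np.array([[np.sum((zp - z) ** 2) for z in zs] for zp in z_preds])
    print(unique_assignment(cost))             # hard, one-to-one assignment
    print(soft_association_weights(z_preds, S, zs))  # soft weights per track
```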

    Akaike and Bayesian Information Criteria for Hidden Markov Models


    Track-before-detect methods in tracking low-observable targets: a survey

    In detection and target tracking applications, low-observable targets are defined as those delivering measurements for which the sensor responses have an SNR below 10 dB. Tracking of low-observable targets is best performed using approaches known as track-before-detect (TBD). Proper TBD approaches combine signal processing and tracking so that detection and track confirmation occur simultaneously. The thresholding process, in which the observations are produced, is avoided so as to preserve the weak signal information in the raw sensor data. Although a variety of TBD methods have been presented and discussed in the literature, no comprehensive comparative study of them is yet available. This paper delivers such a discussion with the goal of facilitating the proper selection of methods in the context of concrete applications.
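    The snippet below is a toy, one-dimensional illustration of the TBD idea described above, not a method from the surveyed literature as such: raw, unthresholded intensity frames are accumulated along motion-feasible paths with dynamic programming, and only the final accumulated score is thresholded to declare a detection; the frame sizes, velocity gate, and injected target strength are illustrative assumptions.

```python
# Toy dynamic-programming track-before-detect sketch on 1-D intensity frames.
# Raw sensor values are accumulated along motion-feasible paths; only the
# final accumulated score would be thresholded to declare a detection.
import numpy as np

def dp_tbd(frames, max_step=1):
    """Accumulate intensity along paths moving at most max_step cells per frame."""
    frames = np.asarray(frames, dtype=float)
    n_frames, n_cells = frames.shape
    score = frames[0].copy()
    back = np.zeros((n_frames, n_cells), dtype=int)
    for t in range(1, n_frames):
        new_score = np.empty(n_cells)
        for c in range(n_cells):
            lo, hi = max(0, c - max_step), min(n_cells, c + max_step + 1)
            prev = int(np.argmax(score[lo:hi])) + lo
            new_score[c] = score[prev] + frames[t, c]
            back[t, c] = prev
        score = new_score
    # Backtrack the best-scoring path (the candidate track).
    end = int(np.argmax(score))
    path = [end]
    for t in range(n_frames - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return score[end], path[::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.normal(0.0, 1.0, size=(6, 30))   # noise-only background
    for t in range(6):
        frames[t, 10 + t] += 2.0                  # weak, slowly drifting target
    best_score, path = dp_tbd(frames, max_step=1)
    print(best_score, path)  # declare a detection only if best_score is high enough
```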

    Use case design and big data analytics evaluation for fishing monitoring

    This work reports on the maritime use case development and the evaluation of the prototype of the project Big Data Analytics for Time Critical Mobility Forecasting (datAcron), funded by the European Union Horizon 2020 programme. datAcron aimed at developing novel methods to detect threats and abnormal activities in streams of large numbers of moving objects over wide maritime and aerial areas. The datAcron maritime use cases, which were scoped in the Maritime Work Package (WP5), focused on fishing activity monitoring, in line with the European Union Maritime Security Strategy. The maritime use cases aimed at providing research partners with maritime domain knowledge to frame, in an operational context, the novel approaches designed during the project. Moreover, the maritime use cases provided a basis for the setup of the datAcron maritime prototype and for the evaluation of big data solutions. Six scenarios falling under three use cases were therefore developed, identifying relevant data sources, situational indicators, and corresponding operational tasks. This paper gives an overview of the design, development, and evaluation of the datAcron maritime use case.
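    As a purely hypothetical illustration of a stream-based situational indicator of the kind mentioned above (not one of the datAcron indicators), the sketch below flags a vessel as possibly trawling when its speed over ground stays within a typical trawling band for a sustained period; the speed band and duration threshold are made-up values.

```python
# Hypothetical, simplified "situational indicator" over an AIS-like stream:
# fire when a vessel's speed over ground stays in a low-speed band long enough.
# The band (2-5 knots) and the 30-minute threshold are illustrative only.
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class VesselState:
    low_speed_since: Optional[float] = None   # timestamp when the band was entered

class TrawlingIndicator:
    def __init__(self, band=(2.0, 5.0), min_duration_s=1800.0):
        self.band = band
        self.min_duration_s = min_duration_s
        self.states: Dict[str, VesselState] = {}

    def update(self, mmsi: str, timestamp: float, sog_knots: float) -> bool:
        """Consume one position report; return True when the indicator fires."""
        state = self.states.setdefault(mmsi, VesselState())
        if self.band[0] <= sog_knots <= self.band[1]:
            if state.low_speed_since is None:
                state.low_speed_since = timestamp
            return timestamp - state.low_speed_since >= self.min_duration_s
        state.low_speed_since = None            # band left: reset the episode
        return False

if __name__ == "__main__":
    indicator = TrawlingIndicator()
    fired = False
    for k in range(40):                          # one report per minute, 3.2 knots
        fired = indicator.update("123456789", timestamp=60.0 * k, sog_knots=3.2)
    print(fired)  # True once the low-speed episode exceeds 30 minutes
```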

    Bayesian inference for biomarker discovery in proteomics: an analytic solution

    This paper addresses the question of biomarker discovery in proteomics. Given clinical data regarding a list of proteins for a set of individuals, the problem tackled is to extract a short subset of proteins whose concentrations are an indicator of the biological status (healthy or pathological). In this paper, it is formulated as a specific instance of variable selection. The originality is that the proteins are not investigated one after the other; instead, the best partition between discriminant and non-discriminant proteins is sought directly. In this way, correlations between the proteins are intrinsically taken into account in the decision. The developed strategy is derived in a Bayesian setting, and the decision is optimal in the sense that it minimizes a global mean error. It is ultimately based on the posterior probabilities of the partitions. The main difficulty is to calculate these probabilities, since they rely on the so-called evidence, which requires marginalizing out all the unknown model parameters. Two models are presented that relate the status to the protein concentrations, depending on whether the latter are biomarkers or not. The first model accounts for biological variability by assuming that the concentrations are Gaussian distributed with a mean and a covariance matrix that depend on the status only for the biomarkers. The second is an extension that also takes into account the technical variability that may significantly impact the observed concentrations. The main contributions of the paper are: (1) a new Bayesian formulation of the biomarker selection problem, (2) the closed-form expression of the posterior probabilities in the noiseless case, and (3) a suitable approximate solution in the noisy case. The methods are numerically assessed and compared to state-of-the-art methods (t-test, LASSO, Bhattacharyya distance, FOHSIC) on synthetic and real data from proteins quantified in human serum by mass spectrometry in selected reaction monitoring mode.
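    The following toy sketch only illustrates the core idea of scoring partitions of proteins into discriminant and non-discriminant sets as a whole, rather than testing proteins one by one; the score used here is a crude plug-in log-likelihood with a BIC-style penalty, not the paper's closed-form posterior evidence, and the simulated data and model choices are assumptions for illustration.

```python
# Toy partition scoring: discriminant proteins get class-dependent Gaussians,
# non-discriminant proteins a single pooled Gaussian. BIC-style penalty only;
# this is NOT the paper's exact Bayesian evidence computation.
import itertools
import numpy as np
from scipy.stats import multivariate_normal as mvn

def log_lik(X, mean, cov):
    return mvn.logpdf(X, mean=mean, cov=cov, allow_singular=True).sum()

def partition_score(X, y, discr):
    """X: (n, p) concentrations, y: 0/1 status, discr: indices of candidate biomarkers."""
    n, p = X.shape
    non = [j for j in range(p) if j not in discr]
    score, n_params = 0.0, 0
    if discr:
        for c in (0, 1):                        # status-dependent model
            Xc = X[y == c][:, list(discr)]
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(len(discr))
            score += log_lik(Xc, Xc.mean(0), cov)
            n_params += len(discr) + len(discr) * (len(discr) + 1) // 2
    if non:
        Xn = X[:, non]                          # status-independent model
        cov = np.cov(Xn, rowvar=False) + 1e-6 * np.eye(len(non))
        score += log_lik(Xn, Xn.mean(0), cov)
        n_params += len(non) + len(non) * (len(non) + 1) // 2
    return score - 0.5 * n_params * np.log(n)   # BIC-style penalty

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = np.repeat([0, 1], 50)
    X = rng.normal(size=(100, 4))
    X[y == 1, 0] += 1.5                         # protein 0 is the true biomarker
    subsets = (s for r in range(5) for s in itertools.combinations(range(4), r))
    best = max(subsets, key=lambda s: partition_score(X, y, s))
    print(best)                                 # typically (0,)
```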

    Composite Event Patterns for Maritime Monitoring

    Maritime monitoring systems support safe shipping as they allow for the real-time detection of dangerous, suspicious, and illegal vessel activities. We have been developing a composite event recognition system for maritime monitoring in the Event Calculus, allowing for both verification and real-time performance. To increase the accuracy of the system, we have been collaborating with domain experts in order to construct effective patterns of maritime activity. We present some indicative patterns in the Event Calculus and evaluate them using two forms of real kinematic vessel data.
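    As a hedged illustration only (the paper's patterns are expressed in the Event Calculus, not Python), the sketch below composes two simple conditions, "vessel is slow" and "vessels are close", into a rendezvous-like candidate composite event; the speed and distance thresholds are illustrative assumptions, not values from the paper.

```python
# Illustrative Python analogue of a composite maritime event: flag a candidate
# meeting when two vessels are simultaneously slow and close to each other.
# Thresholds are illustrative assumptions only.
import math
from typing import Dict, Tuple

Position = Tuple[float, float]              # (latitude, longitude) in degrees

def haversine_m(p1: Position, p2: Position) -> float:
    """Great-circle distance in metres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(a))

def slow(sog_knots: float, limit: float = 1.0) -> bool:
    """Simple event: the vessel is (nearly) stopped."""
    return sog_knots <= limit

def candidate_rendezvous(positions: Dict[str, Position],
                         speeds: Dict[str, float],
                         max_dist_m: float = 500.0):
    """Composite event: pairs of slow vessels within max_dist_m of each other."""
    ids = sorted(positions)
    return [(a, b) for i, a in enumerate(ids) for b in ids[i + 1:]
            if slow(speeds[a]) and slow(speeds[b])
            and haversine_m(positions[a], positions[b]) <= max_dist_m]

if __name__ == "__main__":
    pos = {"V1": (43.000, 6.000), "V2": (43.001, 6.001), "V3": (43.200, 6.300)}
    sog = {"V1": 0.4, "V2": 0.6, "V3": 11.0}
    print(candidate_rendezvous(pos, sog))   # expected: [("V1", "V2")]
```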

    A Fishing Monitoring Use Case in Support to Collaborative Research

    A.-L. Jousselme, E. Camossi, M. Hadzagic, C. Ray, C. Claramunt, E. Reardon, K. Bryan, M. Ilteris, "A Fishing Monitoring Use Case in Support of Collaborative Research." In Proceedings of the Maritime Knowledge Discovery and Anomaly Detection Workshop, Ispra, Varese, Italy, July 5-6, 2016, pp. 57-61.